
    Assessing the utility of an institutional publications officer: a pilot assessment

    Background: The scholarly publication landscape is changing rapidly. We investigated whether the introduction of an institutional publications officer might help facilitate better knowledge of publication topics and related resources, and effectively support researchers to publish.
    Methods: In September 2015, a purpose-built survey about researchers’ knowledge and perceptions of publication practices was administered at five Ottawa-area research institutions. Subsequently, we publicly announced a newly hired publications officer (KDC), who then began conducting outreach at two of the institutions. Specifically, the publications officer gave presentations, held one-to-one consultations, developed electronic newsletter content, and generated and maintained a webpage of resources. In March 2016, we re-surveyed our participants regarding their knowledge and perceptions of publishing. Mean scores for the perception questions, and the percentage of correct responses to the knowledge questions, were computed for each item in the pre- and post-surveys. The difference between these means or percentages was then examined across the survey measures.
    Results: 82 participants completed both surveys. Of this group, 29 indicated that they had been exposed to the publications officer, while the remaining 53 indicated they had not. Interaction with the publications officer was associated with improvements in half of the knowledge items (7/14 variables). While improvements in publishing knowledge were also found among those who reported not interacting with the publications officer (9/14), these effects were often smaller in magnitude. Scores for some publication knowledge variables actually decreased between the pre- and post-surveys (3/14). Researchers’ perceptions of publishing improved for 5/6 variables in the group that interacted with the publications officer.
    Discussion: This pilot provides an initial indication that, in a short timeframe, introducing an institutional publications officer may improve knowledge and perceptions surrounding publishing. The study is limited by its modest sample size and by the merely temporal relationship between the introduction of the publications officer and the observed changes in knowledge and perceptions. A randomized trial is needed to examine whether a publications officer is an effective intervention.
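The pre/post comparison described in the Methods above can be sketched in a few lines. All item names and scores below are illustrative placeholders, not the study's data; the study computed mean Likert scores for perception items and percent-correct for knowledge items, then compared pre- and post-survey values.

```python
# Hypothetical sketch of the pre/post comparison described in the abstract.
# Scores below are invented for illustration only.

def mean(values):
    """Arithmetic mean of a list of numbers."""
    return sum(values) / len(values)

def percent_correct(responses):
    """Percent of True (correct) responses in a list of booleans."""
    return 100.0 * sum(responses) / len(responses)

# One hypothetical perception item (1-5 Likert), pre and post survey.
pre_scores = [3, 4, 3, 2, 4]
post_scores = [4, 4, 5, 3, 4]
perception_change = mean(post_scores) - mean(pre_scores)

# One hypothetical knowledge item: correct/incorrect, pre and post survey.
pre_correct = [True, False, False, True, False]
post_correct = [True, True, False, True, True]
knowledge_change = percent_correct(post_correct) - percent_correct(pre_correct)

print(round(perception_change, 2))  # mean difference on the Likert scale
print(round(knowledge_change, 1))   # change in percent correct
```

The study repeated this difference calculation for each of the 14 knowledge items and 6 perception items, stratified by exposure to the publications officer.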

    Report on a pilot project to introduce a publications officer

    First paragraph: Concerns about deficiencies in the reporting quality of biomedical research have been expressed for more than three decades. In spite of this, articles continue to pass through editorial and peer review processes and are published with critical aspects of their methods and results missing or inadequately described. Reporting biases also remain problematic. Together, these practices limit the integrity of the biomedical literature and hinder reproducibility efforts. In an attempt to alleviate these problems, Moher and Altman recently proposed four potential contributory actions for journals and educational institutions to consider. Here, we present a description of our efforts to implement their first proposed action: the introduction of a publications officer.

    Update on the endorsement of CONSORT by high impact factor journals: a survey of journal

    The CONsolidated Standards Of Reporting Trials (CONSORT) Statement provides a minimum standard set of items to be reported in published clinical trials; it has received widespread recognition within the biomedical publishing community. This research aims to provide an update on the endorsement of CONSORT by high impact medical journals. We performed a cross-sectional examination of the onlin

    Reprint of “The Single-Case Reporting Guideline In BEhavioural interventions (SCRIBE) 2016: explanation and elaboration”

    There is substantial evidence that research studies reported in the scientific literature do not provide adequate information for readers to know exactly what was done and what was found. This problem has been addressed by the development of reporting guidelines, which tell authors what should be reported and how it should be described. Many reporting guidelines are now available for different types of research designs. There is no such guideline for one type of research design commonly used in the behavioral sciences, the single-case experimental design (SCED). The present study addressed this gap. This report describes the Single-Case Reporting guideline In BEhavioural interventions (SCRIBE) 2016, a set of 26 items that authors need to address when writing about SCED research for publication in a scientific journal. Each item is described, a rationale for its inclusion is provided, and examples of adequate reporting taken from the literature are quoted. It is recommended that the SCRIBE 2016 be used by authors preparing manuscripts describing SCED research for publication, as well as by journal reviewers and editors who are evaluating such manuscripts.

    Mapping of reporting guidance for systematic reviews and meta-analyses generated a comprehensive item bank for future reporting guidelines

    Objectives: The aim of the study was to generate a comprehensive bank of systematic review (SR) reporting items to inform an update of the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) 2009 statement.
    Methods: We searched the Enhancing the QUAlity and Transparency Of health Research Network library in May 2019 to identify all reporting guidelines for SRs that were published after 2009, regardless of the scope of the guideline. We also conducted a selective review of four guidance manuals for SRs, three tools for assessing the risk of bias in SRs, six meta-research studies evaluating the reporting quality of SRs using a tailored checklist, and five reporting guidelines for other study designs. One author screened and selected sources for inclusion, extracted reporting guidance from sources, and mapped guidance against the PRISMA 2009 checklist items.
    Results: We included 60 sources providing guidance on reporting of SRs and meta-analyses. From these, we collated a list of 221 unique reporting items. Items were categorized into title (four items), abstract (10 items), introduction (12 items), methods (111 items), results (61 items), discussion (12 items), funding and conflicts of interest (four items), administrative information (three items), and data availability (four items). This exercise generated 175 reporting items that could be added to the guidance in the PRISMA 2009 statement.
    Conclusion: Generation of a comprehensive item bank through review and mapping of the literature facilitates identification of missing items and those needing modification, which may not otherwise be identified by the guideline development team or from other activities commonly used to develop reporting guidelines.
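The per-category item counts reported in the Results can be checked against the stated total of 221 unique items. A minimal tally, using only the numbers given in the abstract:

```python
# Category counts as reported in the abstract's Results section.
categories = {
    "title": 4,
    "abstract": 10,
    "introduction": 12,
    "methods": 111,
    "results": 61,
    "discussion": 12,
    "funding and conflicts of interest": 4,
    "administrative information": 3,
    "data availability": 4,
}

# The nine category counts should sum to the 221 unique items reported.
total = sum(categories.values())
print(total)  # prints 221
```

The counts are internally consistent: the nine categories sum exactly to the 221 unique reporting items reported.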

    An international survey and modified Delphi process revealed editors' perceptions, training needs, and ratings of competency-related statements for the development of core competencies for scientific editors of biomedical journals

    Background: Scientific editors (i.e., those who make decisions on the content and policies of a journal) have a central role in the editorial process at biomedical journals. However, very little is known about the training needs of these editors or what competencies are required to perform effectively in this role.
    Methods: We conducted a survey of perceptions and training needs among scientific editors from major editorial organizations around the world, followed by a modified Delphi process in which we invited the same scientific editors to rate the importance of competency-related statements obtained from a previous scoping review.
    Results: A total of 148 participants completed the survey of perceptions and training needs. At least 80% of participants agreed on six of the 38 skill- and expertise-related statements presented to them as being important or very important to their role as scientific editors. At least 80% agreed on three of the 38 statements as necessary skills they perceived themselves as possessing (well or very well). The top five items on participants’ list of training needs were training in statistics, research methods, publication ethics, recruiting and dealing with peer reviewers, and indexing of journals. The three rounds of the Delphi were completed by 83, 83, and 73 participants, respectively, which ultimately produced a list of 23 “highly rated” competency-related statements and another 86 “included” items.
    Conclusion: Both the survey and the modified Delphi process will be critical for understanding knowledge and training gaps among scientific editors when designing curriculum around core competencies in the future.
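The 80% agreement cutoff described in the Results can be sketched as a simple proportion check. The ratings below are invented for illustration, and the rating labels are assumptions based on the "important or very important" wording in the abstract:

```python
# Hypothetical sketch of the 80% agreement threshold for one statement.
# Ratings are illustrative placeholders, not the survey's data.
ratings = [
    "very important", "important", "important", "neutral",
    "important", "very important", "important", "important",
    "very important", "important",
]

# Count respondents rating the statement "important" or "very important".
agree = sum(r in ("important", "very important") for r in ratings)
proportion = agree / len(ratings)

# A statement clears the bar when at least 80% of respondents agree.
meets_threshold = proportion >= 0.80
print(proportion, meets_threshold)
```

Applying this cutoff statement by statement is how the survey arrived at six of 38 statements rated important to the editor role, and three of 38 rated as skills respondents felt they possessed.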